2,140 research outputs found

    Risks and controls in the implementation of ERP systems

    The implementation of ERP systems has been problematic for many organizations. Given the many reports of substantial failures, the implementation of packaged ERP software and the associated changes in business processes has proved not to be an easy task. As many organizations have discovered, implementing an ERP system can be a monumental disaster unless the process is handled carefully. The aim of this study is to identify the risks and controls used in ERP implementations, with the objective of understanding the ways in which organizations can minimize the business risks involved. By controlling and minimizing the major business risks in the first instance, the scene can be set for the successful implementation of an ERP system. The study was motivated by the significance, for both the research and practice communities, of understanding the risks and controls critical for the successful implementation of ERP systems. Following the development of a model of risks and controls, a field study of an ERP system implementation project in an organization was conducted to provide a limited test of the model. The results from the field study supported the risks and controls identified in the literature, and also identified several controls not mentioned in the reviewed literature. The study lays the foundation for further research into the risk/control framework so important for successful ERP system implementations.

    Resources-Events-Agents Design Theory: A Revolutionary Approach to Enterprise System Design

    Enterprise systems typically include constructs such as ledgers and journals, with debit and credit entries, as central pillars of the systems’ architecture, due in part to accountants and auditors who demand those constructs. At best, structuring systems with such constructs as base objects results in storing the same data at multiple levels of aggregation, which creates inefficiencies in the database. At worst, basing systems on such constructs destroys details that are unnecessary for accounting but that may facilitate decision making by other enterprise functional areas. McCarthy (1982) proposed the resources-events-agents (REA) framework as an alternative structure for a shared data environment more than thirty years ago, and scholars have since developed it into a robust design theory. Despite this legacy, the broad IS community has not widely researched REA. In this paper, we discuss REA’s genesis and primary constructs, provide a history of REA research, discuss REA’s impact on practice, and speculate as to what the future may hold for REA-based enterprise systems. We invite IS researchers to consider integrating REA constructs with other theories and various emerging technologies to help advance the future of information systems and business research.
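    To make the shared-data idea concrete, the sketch below models resources, events, and agents as plain data structures, with a duality link pairing a sale with its cash receipt. The Python rendering and all names are illustrative assumptions of mine, not constructs taken from McCarthy (1982) or from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, minimal REA sketch: economic events link resources
# (what is exchanged) to agents (who participates), and duality pairs
# an increment event (a sale) with a decrement event (a cash receipt).

@dataclass
class Resource:          # e.g., inventory item, cash
    name: str

@dataclass
class Agent:             # e.g., customer, clerk
    name: str

@dataclass
class Event:             # e.g., sale, cash receipt
    name: str
    resource: Resource   # stockflow: the resource the event affects
    provider: Agent      # agent giving up the resource
    recipient: Agent     # agent receiving the resource
    duality: List["Event"] = field(default_factory=list)  # paired events

customer, clerk = Agent("customer"), Agent("clerk")
sale = Event("sale", Resource("inventory"), clerk, customer)
receipt = Event("cash receipt", Resource("cash"), customer, clerk)
sale.duality.append(receipt)
receipt.duality.append(sale)
```

    On this structure, debit/credit ledger entries become derivable views over the stored events rather than base objects of the schema, which is the design shift the abstract describes.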

    Diagrammatic Attention Management and the Effect of Conceptual Model Structure on Cardinality Validation

    Diagrams are frequently used to document various components of information systems, from the procedures established for user-system interaction to the structure of the database at the system’s core. Past research has revealed that diagrams are not always used as effectively as their creators intend. This study proposes a theory of diagrammatic attention management to contribute to the exploration of diagram effectiveness. Based upon diagrammatic attention management, this study demonstrates that the type of diagram most commonly used to represent conceptual models is less effective than three alternatives for validating the models’ cardinalities. Most conceptual models are documented using entity-relationship diagrams that include a full transaction cycle or module on a single page, i.e., an aggregate diagrammatic format. Participants in this study using three alternative representations (disaggregate diagrammatic, aggregate sentential, and disaggregate sentential) outperformed users of the aggregate diagrammatic format for cardinality validation. The results suggest that, to use aggregate diagrams effectively, users need a mechanism by which to direct their attention while using them. If such an attention-direction mechanism is not inherent in a diagram, it may need to be applied as an external tool, or the diagram may need to be disaggregated to facilitate use.
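    As an illustration of what the cardinality-validation task asks of a reader (the study tested human validators working from diagrams and sentences, not code), the hypothetical check below expresses an "each sale relates to exactly one customer" constraint and flags instance data that violates it.

```python
# Hypothetical illustration (not the study's materials): a cardinality
# constraint from a conceptual model, checked against instance data.
# Constraint: each Sale participates with exactly one Customer (1..1).

sales_to_customers = {          # assumed instance data
    "sale1": ["cust1"],
    "sale2": ["cust1"],
    "sale3": [],                # violates the minimum cardinality
}

def cardinality_violations(links, min_card=1, max_card=1):
    """Return instances whose participation count falls outside [min, max]."""
    return {inst: len(related) for inst, related in links.items()
            if not (min_card <= len(related) <= max_card)}

print(cardinality_violations(sales_to_customers))  # {'sale3': 0}
```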

    Pixels simultaneous detection probabilities and spatial resolution determination of pixelized detectors by means of correlation measurements

    A novel method is proposed to estimate the pixel simultaneous detection probabilities and the spatial resolution of pixelized detectors, based on determining the statistical correlations between neighboring detector pixels. The correlations are determined from noise variance measurements of isolated pixels and of the differences between neighboring pixels. The method is validated using images from two different GE Senographe 2000D mammographic units. The pixelized detector was irradiated with x-rays over its entire surface. It is shown that the pixel simultaneous detection probabilities can be estimated with an accuracy of 0.001 - 0.003, where the systematic error is estimated to be smaller than 0.005. The presampled two-dimensional point-spread function (PSF0) is determined using a single Gaussian approximation and a sum of two Gaussians. The results for the presampled PSF0 show that the single Gaussian approximation is not appropriate, while the sum of two Gaussians provides the best fit and predicts the existence of a large (~50%) narrow component; this is also supported by a recent simulation study of columnar indirect digital detectors by A. Badano et al. The sampled two-dimensional PSF is determined using Monte Carlo simulation for an L-shaped uniformly distributed acceptance function at different fill factors. The detector spatial resolution is estimated from the sampled PSF and is 54 and 58 μm for the two units. The calculation of the presampled modulation transfer function based on the PSF0 estimate shows that the existing data can only be reproduced using the single Gaussian approximation; the sum of two Gaussians yields significantly larger values in the higher-frequency region for both units. Comment: 14 pages, 5 tables, 10 figures; added 1 figure and section 3.
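    A minimal NumPy sketch of the correlation measurement as this abstract describes it (my reading, not the authors' code): the covariance between neighboring pixels follows from the noise variance of individual pixels and of their differences, via cov(a, b) = (var(a) + var(b) - var(a - b)) / 2.

```python
import numpy as np

# Toy flat-field frames standing in for uniformly irradiated x-ray images;
# real data would come from the detector, not a Poisson generator.
rng = np.random.default_rng(0)
frames = rng.poisson(1000.0, size=(200, 64, 64)).astype(float)

a = frames[:, :, :-1]           # each pixel
b = frames[:, :, 1:]            # its right-hand neighbor
var_a = a.var(axis=0)
var_b = b.var(axis=0)
var_diff = (a - b).var(axis=0)  # variance of neighbor differences

cov_ab = 0.5 * (var_a + var_b - var_diff)
corr = cov_ab / np.sqrt(var_a * var_b)
print(f"mean neighbor correlation: {corr.mean():.4f}")  # ~0 for independent pixels
```

    For a real detector, a positive mean correlation signals charge sharing between neighbors, from which the simultaneous detection probability is estimated.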

    The use of cosmic muons in detecting heterogeneities in large volumes

    The muon intensity attenuation method for detecting heterogeneities in large matter volumes is analyzed. Approximate analytical expressions for estimating the collection time and the signal-to-noise ratio are proposed and validated by Monte Carlo simulations. Important parameters, including the point spread function and the coordinate reconstruction uncertainty, are also estimated using Monte Carlo simulations. Comment: 8 pages, 11 figures, submitted to NIM
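    Collection-time and signal-to-noise estimates for muon counting rest on Poisson statistics; the sketch below reproduces that generic reasoning with assumed flux, area, and attenuation values, not the paper's actual expressions.

```python
# Back-of-the-envelope counting statistics for muon attenuation imaging.
# All numbers below are illustrative assumptions, not taken from the paper.

flux = 1.0 / 60.0        # ~1 muon / cm^2 / min at sea level, in cm^-2 s^-1
area = 100.0             # assumed projected cell area on the target, cm^2
attenuation = 0.05       # assumed fractional count deficit from a heterogeneity

def collection_time(snr_target=5.0):
    """Time (s) until the deficit eps*N exceeds snr_target * sqrt(N)."""
    # signal = eps * N, noise ~ sqrt(N)  =>  SNR = eps * sqrt(N)
    n_required = (snr_target / attenuation) ** 2
    return n_required / (flux * area)

print(f"~{collection_time() / 3600.0:.1f} h for a 5-sigma detection")
```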

    Measurement of the Cross Section Asymmetry of the Reaction gp-->pi0p in the Resonance Energy Region Eg = 0.5 - 1.1 GeV

    The cross section asymmetry Sigma has been measured for the photoproduction of pi0 mesons off protons, using polarized photons in the energy range Eg = 0.5 - 1.1 GeV. The CM angular coverage is Theta = 85 - 125 deg, with energy and angle steps of 25 MeV and 5 deg, respectively. The Sigma data obtained, which cover the second and third resonance regions, are compared with existing experimental data and recent phenomenological analyses. The influence of these measurements on such analyses is also considered.
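    For linearly polarized photons on an unpolarized target, the beam asymmetry is conventionally extracted from the cos(2*phi) modulation of the azimuthal yield, N(phi) ~ N0 * (1 - P * Sigma * cos(2*phi)). The toy fit below illustrates that extraction with assumed polarization and asymmetry values, not data from this experiment.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed values for illustration only.
P_true, sigma_true = 0.8, 0.4

# Generate toy azimuthal counts with Poisson fluctuations.
phi = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
rng = np.random.default_rng(1)
counts = 1000.0 * (1.0 - P_true * sigma_true * np.cos(2.0 * phi))
counts = rng.poisson(counts).astype(float)

def model(phi, n0, p_sigma):
    # p_sigma is the product P * Sigma; the known polarization P is
    # divided out afterwards to recover Sigma.
    return n0 * (1.0 - p_sigma * np.cos(2.0 * phi))

(n0_fit, p_sigma_fit), _ = curve_fit(model, phi, counts, p0=[1000.0, 0.1])
print(f"Sigma = {p_sigma_fit / P_true:.3f} (true {sigma_true})")
```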

    Multiplicity distribution and spectra of negatively charged hadrons in Au+Au collisions at sqrt(s_nn) = 130 GeV

    The minimum bias multiplicity distribution and the transverse momentum and pseudorapidity distributions for central collisions have been measured for negative hadrons (h-) in Au+Au interactions at sqrt(s_NN) = 130 GeV. The multiplicity density at midrapidity for the 5% most central interactions is dNh-/deta|_{eta=0} = 280 +- 1 (stat) +- 20 (syst), an increase per participant of 38% relative to ppbar collisions at the same energy. The mean transverse momentum is 0.508 +- 0.012 GeV/c, larger than in central Pb+Pb collisions at lower energies. The scaling of the h- yield per participant is a strong function of pt. The pseudorapidity distribution is almost constant within |eta| < 1. Comment: 6 pages, 3 figures